🎰 Bandit Algorithms

Topics: Multi-Armed Bandits, Thompson Sampling, UCB, Exploration Strategy, Online Learning
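As a concrete illustration of two of the listed strategies, here is a minimal sketch of Beta-Bernoulli Thompson Sampling and UCB1 on a simulated multi-armed bandit. All names (`BernoulliBandit`, `thompson_sampling`, `ucb1`) and the arm probabilities are hypothetical choices for this example, not from the source.

```python
import math
import random

class BernoulliBandit:
    """Simulated bandit: each arm pays 1 with a fixed hidden probability."""
    def __init__(self, probs, seed=0):
        self.probs = probs
        self.rng = random.Random(seed)

    def pull(self, arm):
        return 1 if self.rng.random() < self.probs[arm] else 0

def thompson_sampling(bandit, n_arms, horizon, seed=0):
    """Sample each arm's Beta posterior; play the arm with the highest sample."""
    rng = random.Random(seed)
    alpha = [1] * n_arms  # prior successes + 1
    beta = [1] * n_arms   # prior failures + 1
    total = 0
    for _ in range(horizon):
        samples = [rng.betavariate(alpha[a], beta[a]) for a in range(n_arms)]
        arm = max(range(n_arms), key=lambda a: samples[a])
        reward = bandit.pull(arm)
        alpha[arm] += reward
        beta[arm] += 1 - reward
        total += reward
    return total

def ucb1(bandit, n_arms, horizon):
    """Play the arm maximising empirical mean + sqrt(2 ln t / n_a)."""
    counts = [0] * n_arms
    means = [0.0] * n_arms
    total = 0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialise: play each arm once
        else:
            arm = max(
                range(n_arms),
                key=lambda a: means[a] + math.sqrt(2 * math.log(t) / counts[a]),
            )
        reward = bandit.pull(arm)
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # running mean
        total += reward
    return total

if __name__ == "__main__":
    probs = [0.2, 0.5, 0.8]  # hidden arm means (example values)
    horizon = 2000
    ts = thompson_sampling(BernoulliBandit(probs), len(probs), horizon)
    ucb = ucb1(BernoulliBandit(probs), len(probs), horizon)
    print("Thompson Sampling reward:", ts)
    print("UCB1 reward:", ucb)
```

Both strategies should comfortably beat uniform-random play (expected reward 0.5 × 2000 = 1000 here) by concentrating pulls on the 0.8 arm; Thompson Sampling explores via posterior randomness, while UCB1 explores via a deterministic confidence bonus that shrinks as an arm is pulled more often.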